Construction of Decision Tree for Insurance Policy System through Entropy and GINI Index

Authors

  • Narander Kumar
  • Vishal Verma
  • Vipin Saxena
Abstract

In the modern age of computing, sparse and irregular samples of data are needed to reconstruct large datasets with many different types of dimensions. Analyzing this type of data, however, is challenging. One issue is the robust selection of data. Many analytical tools are available, but the crucial parts are decomposition, clustering, and gradient estimation of the data. When the attributes of a sparsely sampled dataset are extracted, the most common attributes may lead to inaccurate results. To address these issues, the present paper provides solutions through entropy and GINI Index calculations on the data of an insurance company. After calculating the entropy and the GINI Index, a decision tree for the insurance company's sample data is presented.
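The two impurity measures named in the abstract have standard definitions: for class proportions p_i, entropy is -Σ p_i log2(p_i) and the Gini index is 1 - Σ p_i². A minimal sketch of both calculations on hypothetical insurance labels (the `sample` data below is illustrative, not taken from the paper):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a class-label sample: -sum(p * log2(p))."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def gini_index(labels):
    """Gini impurity of a class-label sample: 1 - sum(p^2)."""
    total = len(labels)
    return 1 - sum((c / total) ** 2 for c in Counter(labels).values())

# Hypothetical "claim filed?" labels for ten insurance records.
sample = ["yes"] * 4 + ["no"] * 6
print(entropy(sample))     # ≈ 0.971
print(gini_index(sample))  # 0.48
```

A decision-tree builder picks, at each node, the attribute split that most reduces one of these impurity measures.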

Similar resources

Application of Different Methods of Decision Tree Algorithm for Mapping Rangeland Using Satellite Imagery (Case Study: Doviraj Catchment in Ilam Province)

Using satellite imagery for the study of Earth's resources is attended by many researchers. In fact, the various phenomena have different spectral responses in electromagnetic radiation. One major application of satellite data is the classification of land cover. In recent years, a number of classification algorithms have been developed for classification of remote sensing data. One of the most nota...


Unifying Decision Trees Split Criteria Using Tsallis Entropy

The construction of efficient and effective decision trees remains a key topic in machine learning because of their simplicity and flexibility. A lot of heuristic algorithms have been proposed to construct near-optimal decision trees. Most of them, however, are greedy algorithms which have the drawback of obtaining only local optimums. Besides, common split criteria, e.g. Shannon entropy, Gain ...
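Tsallis entropy, S_q(p) = (1 - Σ p_i^q)/(q - 1), is a standard generalization that unifies the common split criteria: it recovers Shannon entropy (in nats) in the limit q → 1 and the Gini impurity at q = 2. A minimal sketch of this relationship (the details of the paper's unified criterion beyond this definition are not given in the excerpt above):

```python
import math

def tsallis_entropy(probs, q):
    """Tsallis entropy S_q = (1 - sum(p^q)) / (q - 1)."""
    if abs(q - 1.0) < 1e-12:
        # Limit q -> 1 gives Shannon entropy with the natural log.
        return -sum(p * math.log(p) for p in probs if p > 0)
    return (1 - sum(p ** q for p in probs)) / (q - 1)

probs = [0.4, 0.6]
print(tsallis_entropy(probs, 2))  # 0.48 -- the Gini impurity
print(tsallis_entropy(probs, 1))  # ≈ 0.673 -- Shannon entropy in nats
```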


Statistical Sources of Variable Selection Bias in Classification Tree Algorithms Based on the Gini Index

Evidence for variable selection bias in classification tree algorithms based on the Gini Index is reviewed from the literature and embedded into a broader explanatory scheme: Variable selection bias in classification tree algorithms based on the Gini Index can be caused not only by the statistical effect of multiple comparisons, but also by an increasing estimation bias and variance of the spli...


SLEAS: Supervised Learning using Entropy as Attribute Selection Measure

There is emerging importance in scaling up the broadly used decision tree learning algorithms to huge datasets. Even though abundant diverse methodologies have been proposed, a fast tree-growing algorithm without a substantial decrease in accuracy or a substantial increase in space complexity is essential to a greater extent. This paper aims at improving the performance of the SLIQ (Supervised Le...


A Mean Deviation Based Splitting Criterion for Classification Tree

For the learning of Classification Trees, many researchers have used different splitting criteria, among which the most common impurity-based criteria are the Gini index, the entropy function, and an exponent-based index. Comparing misclassification rates, no splitting criterion can be declared as providing the best results in every situation. In this study, a new Mean Deviation based index has been pro...



Journal title:

Volume   Issue 

Pages  -

Publication date 2013